Information and Entropy
Abstract
The general problem of inductive inference is to update from a prior probability distribution to a posterior distribution when new information becomes available. Bayes' rule is the natural way to update when the new information is in the form of data, while Jaynes' method of maximum entropy, MaxEnt, is designed to handle information in the form of constraints. However, the range of applicability of either method is limited: Bayes' rule can handle arbitrary priors and data but not arbitrary constraints, while MaxEnt can handle arbitrary constraints (including data) but not arbitrary priors. We show that Skilling's method of induction leads to a unique general theory of inductive inference, the method of Maximum relative Entropy (M.E.). The M.E. method is designed for updating from arbitrary priors given information in the form of arbitrary constraints (including data). Four axioms (locality, coordinate invariance, consistency for independent subsystems, and consistency for large numbers) suffice to single out the logarithmic relative entropy as the unique tool for updating; other entropy functionals are ruled out, although they may still be useful for other purposes. The M.E. method includes both MaxEnt and Bayes' rule as special cases and therefore unifies the two themes of these workshops, the Maximum Entropy and the Bayesian methods, into a single general inference scheme that can handle problems beyond the reach of either method separately. We conclude with a couple of simple illustrative examples.
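As a concrete illustration of the MaxEnt special case described above, here is a minimal sketch (not taken from the paper) of updating a discrete prior subject to a single expectation constraint. The tilted-exponential form of the solution, p_i ∝ q_i exp(λ f_i), is the standard result of minimizing the logarithmic relative entropy under a moment constraint; the function and variable names, and the bisection solver for λ, are illustrative choices.

```python
import math

def max_rel_entropy_update(prior, f, target, tol=1e-10):
    """Return the distribution closest to `prior` in relative entropy
    subject to the constraint sum_i p_i * f[i] == target.

    The constrained minimizer has the exponential-tilting form
        p_i ∝ prior_i * exp(lam * f_i),
    and E[f] is monotone increasing in lam (its derivative is a
    variance), so lam can be found by simple bisection.
    """
    def expectation(lam):
        w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z

    lo, hi = -50.0, 50.0  # bracket for lam; assumes target is attainable
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if expectation(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes-style dice example: update a uniform prior on six faces
# given the constraint that the expected face value is 4.5.
prior = [1 / 6] * 6
faces = [1, 2, 3, 4, 5, 6]
post = max_rel_entropy_update(prior, faces, 4.5)
```

Because the constraint raises the mean above 3.5, the posterior tilts probability toward the higher faces; with a uniform prior this reduces to ordinary MaxEnt, while a non-uniform prior would be handled by the same code, which is the point of the relative-entropy formulation.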
Similar resources
Tsallis Entropy and Conditional Tsallis Entropy of Fuzzy Partitions
The purpose of this study is to define the concepts of Tsallis entropy and conditional Tsallis entropy of fuzzy partitions and to obtain some results concerning this kind of entropy. We show that the Tsallis entropy of fuzzy partitions has the subadditivity and concavity properties. We study this information measure under the refinement and zero mode subset relations. We check the chain rules for ...
Evaluation of monitoring network density using discrete entropy theory
The regional evaluation of monitoring stations for water resources can be of great importance due to its role in finding appropriate locations for stations, the maximum gathering of useful information and preventing the accumulation of unnecessary information and ultimately reducing the cost of data collection. Based on the theory of discrete entropy, this study analyzes the density of rain gag...
Entropy of infinite systems and transformations
The Kolmogorov-Sinai entropy is a far reaching dynamical generalization of Shannon entropy of information systems. This entropy works perfectly for probability measure preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
A new approach factor-entropy with application to business costs of SMEs in Shanghai
Business cost is acknowledged as one of the priorities in SMEs research. In this study, the business cost of SMEs in Shanghai was primarily measured using the factor-entropy analysis method. The purpose of this study is to effectively resolve the issue of simplification and assignment of the evaluation index system on business costs of SMEs in Shanghai. However, this study uses factor analysis to interpret t...
Some properties of the parametric relative operator entropy
The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...
Some Results Based on Entropy Properties of Progressive Type-II Censored Data
In many life-testing and reliability studies, the experimenter might not always obtain complete information on failure times for all experimental units. One of the most common censoring schemes is progressive type-II censoring. The aim of this paper is to characterize the parent distributions based on the Shannon entropy of progressive type-II censored order statistics. It is shown that the equality...